# Ancient Greek processing
## SyllaBERTa
SyllaBERTa is an experimental Transformer-based masked language model for Ancient Greek texts, using syllable-level tokenization.
Tags: Large Language Model, Transformers, Other
Author: Ericu950

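The syllable-level tokenization mentioned in SyllaBERTa's description can be illustrated with a naive rule-based syllabifier. This is only a sketch of the general idea, not SyllaBERTa's actual tokenizer: the diphthong set and the consonant-cluster splitting rule below are simplified assumptions.

```python
import unicodedata

# Base vowels and common diphthongs of Ancient Greek (lowercase).
VOWELS = set("αεηιουω")
DIPHTHONGS = {"αι", "ει", "οι", "υι", "αυ", "ευ", "ου", "ηυ"}

def base(ch):
    """Strip accents/breathings to get the base letter (assumes NFC input)."""
    return unicodedata.normalize("NFD", ch)[0]

def is_vowel(ch):
    return base(ch).lower() in VOWELS

def units(word):
    """Split a word into letters, merging diphthongs into single units."""
    out, i = [], 0
    while i < len(word):
        pair = (base(word[i]) + base(word[i + 1])).lower() if i + 1 < len(word) else ""
        if pair in DIPHTHONGS:
            out.append(word[i:i + 2])
            i += 2
        else:
            out.append(word[i])
            i += 1
    return out

def syllabify(word):
    """Naive syllabification: one vowel/diphthong nucleus per syllable;
    a single intervocalic consonant starts the next syllable, and
    clusters are split after their first consonant."""
    us = units(word)
    nuclei = [k for k, u in enumerate(us) if is_vowel(u[0])]
    if not nuclei:
        return [word]
    sylls, start = [], 0
    for a, b in zip(nuclei, nuclei[1:]):
        cons = b - a - 1                      # consonants between the nuclei
        end = a + 1 if cons <= 1 else a + 2   # where this syllable ends
        sylls.append("".join(us[start:end]))
        start = end
    sylls.append("".join(us[start:]))         # final consonants close the word
    return sylls

print(syllabify("ἄνθρωπος"))  # ['ἄν', 'θρω', 'πος']
```

A real syllable-level tokenizer would map each such syllable to an entry in a fixed vocabulary, which keeps sequences much shorter than character-level input while staying finer-grained than word-level tokens.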
## Ancient Greek BERT
The first and only available subword BERT model for Ancient Greek, achieving state-of-the-art fine-tuned performance on part-of-speech tagging and morphological analysis tasks.
Tags: Large Language Model, Transformers
Author: pranaydeeps

## Aristoberto
A Transformer model for low-resource Ancient Greek, initialized from the modern Greek BERT model and further pre-trained on Ancient Greek corpora.
Tags: Large Language Model, Transformers, Other
Author: Jacobo